# Wikipedia corpus training
## Gn Bert Small Cased
License: MIT
A BERT model pretrained for Guarani (6 layers, case-sensitive). Trained on Wikipedia + Wiktionary (approx. 800k tokens).
Task: Large Language Model · Library: Transformers · Language: Other

Author: mmaguero
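A minimal sketch of querying this checkpoint for masked-token prediction with the Hugging Face `transformers` library; the `fill-mask` pipeline usage is an assumption based on standard BERT conventions, not taken from the model card.

```python
# Hypothetical usage sketch for mmaguero/gn-bert-small-cased (assumption:
# a standard BERT-style masked-language-model head, queried via the
# transformers fill-mask pipeline).

def predict_masked(sentence: str, top_k: int = 5) -> list:
    """Return the top_k candidate tokens for the [MASK] slot in `sentence`."""
    # Imported lazily so this helper can be defined without transformers installed.
    from transformers import pipeline

    fill = pipeline("fill-mask", model="mmaguero/gn-bert-small-cased", top_k=top_k)
    return [result["token_str"] for result in fill(sentence)]
```

Pass a Guarani sentence containing a single `[MASK]` token to get ranked completions.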
## Mt5 Zh Ja En Trimmed
A multilingual translation model fine-tuned from mt5-base, supporting translation between any pair of Chinese, Japanese, and English.
Task: Machine Translation · Library: Transformers · Languages: Multiple

Author: K024
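A minimal sketch of running this checkpoint with `transformers`; the `"zh2en: "`-style direction prefix is an assumption based on common mT5 fine-tuning conventions, so confirm the exact input format on the model card.

```python
# Hypothetical usage sketch for K024/mt5-zh-ja-en-trimmed (assumption:
# the model expects a "<src>2<tgt>: " task prefix before the source text).

def build_prompt(text: str, src: str, tgt: str) -> str:
    """Build the assumed direction-prefixed input, e.g. 'zh2en: <text>'."""
    return f"{src}2{tgt}: {text}"

def translate(text: str, src: str = "zh", tgt: str = "en") -> str:
    """Translate `text` between zh/ja/en with the trimmed mT5 checkpoint."""
    # Imported lazily so build_prompt works without transformers installed.
    from transformers import MT5ForConditionalGeneration, T5Tokenizer

    model_id = "K024/mt5-zh-ja-en-trimmed"
    tokenizer = T5Tokenizer.from_pretrained(model_id)
    model = MT5ForConditionalGeneration.from_pretrained(model_id)
    inputs = tokenizer(build_prompt(text, src, tgt), return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

For other directions, swap the language codes, e.g. `translate(text, src="ja", tgt="zh")`.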